- <text id=93HT0549>
- <link 90TT2386>
- <link 89TT3077>
- <link 89TT1390>
- <title>
- 1982: Big Dimwits And Little Geniuses
- </title>
- <history>
- TIME--The Weekly Newsmagazine--1982 Highlights
- </history>
- <article>
- <source>Time Magazine</source>
- <hdr>
- January 3, 1983
- MACHINE OF THE YEAR
- Big Dimwits and Little Geniuses
- </hdr>
- <body>
- <p>Yesterday's klutzy machines have become today's micromarvels
- </p>
- <p> The first electronic digital computer in the U.S., unveiled at
- the University of Pennsylvania in 1946, was a collection of
- 18,000 vacuum tubes, 70,000 resistors, 10,000 capacitors and
- 6,000 switches, and occupied the space of a two-car garage. Yet
- ENIAC (for Electronic Numerical Integrator and Calculator) was,
- in retrospect, a dimwit. When it worked, it did so only for
- short bursts because its tubes kept burning out. Built to
- calculate artillery firing tables, the half-million-dollar ENIAC
- could perform 5,000 additions or subtractions per second. Today
- almost any home computer, costing only a few hundred dollars,
- can outperform poor old ENIAC as a "number cruncher."
- </p>
- <p> Computer designers have obviously come a long way. But behind
- their spectacular achievements is a colorful history, one
- involving so many characters, so many innovations and such
- wrenching efforts that no single person or even country can
- claim authorship of the computer.
- </p>
- <p> In a sense, humans have been computing--manipulating and
- comparing numbers or anything that they may represent--since
- they first learned how to count, probably with pebbles (the word
- calculus stems from the Latin for stone). At least 2,500 years
- ago, the Chinese, among others, discovered that they could
- handle numbers more easily by sliding little beads on strings.
- Their invention, the abacus, is still in use.
- </p>
- <p> In 1642, perhaps pained by the long hours his tax-collector
- father spent doing sums, a 19-year-old French prodigy named
- Blaise Pascal made an automatic device that could add or
- subtract with the turning of little wheels. But the clerks who
- spent their lives doing calculations in those days viewed
- Pascal's gadget as a job threat, and it never caught on. A short
- time later, the German mathematician Gottfried Wilhelm Leibniz
- added the power of multiplication and division. Said he: "It
- was unworthy of excellent men to lose hours like slaves in the
- labor of calculations..."
- </p>
- <p> But such mechanical contrivances were no more than calculators.
- They could only do arithmetic, and very clumsily at that. The
- first man to conceptualize a true computer, one that would be
- able to do math and much more, was an irascible 19th century
- English mathematician named Charles Babbage. Incensed by the
- inaccuracies he found in the mathematical tables of his time,
- the ingenious Babbage (father of the speedometer, the cowcatcher
- for locomotives and the first reliable life-expectancy tables)
- turned his fertile brain to creating an automaton that could
- rapidly and accurately calculate long lists of functions like
- logarithms. The result was an intricate system of gears and cogs
- called the Difference Engine.
- </p>
- <p> Babbage managed to build only a simple model because the
- craftsmen of the day were unable to machine the precise parts
- required by the contraption. But the temperamental genius soon
- had a bolder concept. He called it the Analytical Engine. Even
- more complex than its predecessor, it had all the essentials of
- a modern computer: a logic center, or what Babbage called the
- "mill," which manipulated data according to certain rules; a
- memory, or "store," for holding information; a control unit for
- carrying out instructions; and the means for getting data into
- and out of the machine. Most important of all, its operating
- procedures could be changed at will: the Analytical Engine was
- programmable.
- </p>
- <p> Babbage worked obsessively on his machine for nearly 40 years.
- Presumably he was the world's first computer "nerd." Until his
- death in 1871, he ground out more and more sketches. The
- Analytical Engine became hopelessly complicated. It required
- thousands of individual wheels, levers and belts, all working
- together in exquisite precision. Few people understood what he
- was doing, with the notable exception of Lord Byron's beautiful
- and mathematically gifted daughter, Ada, the Countess of
- Lovelace, who became Babbage's confidante and public advocate.
- When the government cut off funds for the Analytical Engine, she
- and Babbage tried devising a betting system for recouping the
- money at the track. They lost thousands of pounds.
- </p>
- <p> The Analytical Engine was never built. It would have been as big
- as a football field and probably needed half a dozen steam
- locomotives to power it. But one of its key ideas was soon
- adapted. To feed his machine its instructions, Babbage planned
- to rely on punched cards, like those used to control color and
- designs in the looms developed by the French weaver Joseph Marie
- Jacquard. Ada poetically described the scheme this way: "The
- Analytical Engine weaves algebraical patterns just as the
- Jacquard loom weaves flowers and leaves."
- </p>
- <p> In the U.S., a young engineer named Herman Hollerith persuaded
- the Census Bureau to try the punched-card idea during the
- forthcoming 1890 census. Such personal information as age, sex,
- marital status and race was encoded on cards, which were read
- by electric sensors, and tabulated. Hollerith's equipment worked
- so well that the Census Bureau's clerks occasionally shut it off
- to protect their sinecures. Soon punched cards were widely used
- in office machinery, including that made by a small New York
- firm that absorbed Hollerith's company and became International
- Business Machines.
- </p>
- <p> Babbage's dream of a true computer--one that could solve any
- number of problems--was not realized until the 1930s. In
- Hitler's Germany, an obscure young engineer named Konrad Zuse,
- using the German equivalent of an Erector set for parts and his
- parents' living room as his workshop, built a simple computer
- that could perform a variety of tasks; its descendants
- calculated wing designs for the German aircraft industry during
- World War II. At Bell Telephone Laboratories in the U.S., the
- research arm of AT&T, a mathematician named George Stibitz built
- a similar device in 1939 and even showed how it could do
- calculations over telephone wires. This was the first display
- of remote data processing. During the war a British group,
- putting into practice some of the ideas of their brilliant
- countryman Alan Turing, built a computer called Colossus I that
- helped break German military codes. The British, German and
- U.S. machines all shared a common characteristic: they were the
- first computers to use the binary system of numbers, the
- standard internal language of today's digital computers.
- </p>
- <p> In this they departed from Babbage's "engine." The engine was
- designed to count by the tens, or the decimal system. Employing
- ten digits (0 to 9), the decimal system probably dates from the
- time when humans realized they had ten fingers and ten toes.
- (Digit comes from the Latin for finger or toe.) But there are
- other ways of counting as well, by twelves, say, as in the hours
- of the day or months of the year (duodecimal system). In the
- binary system, only two digits are used, 0 and 1. To create a
- 2, you simply move a column to the left, just as you do to
- create a 10 in the decimal system. Thus if zero is represented
- by 0 and one by 1, then two is 10, three 11, four 100, five 101,
- six 110, seven 111, eight 1000, and so forth.
- </p>
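- <p> One way to see the pattern is with a few lines of a modern
- programming language. This minimal Python sketch simply prints
- the first few integers next to their binary forms:
- </p>
- <p>
-     # Print 0 through 8 alongside their binary representations,
-     # mirroring the list above: two is 10, three 11, four 100 ...
-     for n in range(9):
-         print(n, format(n, "b"))
- </p>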
- <p> The binary system is enormously cumbersome. Although any number
- can be represented, it requires exasperatingly long strings of
- 0s and 1s. But putting such a system to work is a snap for
- digital computers. At their most fundamental level, the
- computers are little more than complex mazes of on-off switches
- that reduce all information within the machine to one of two
- states: yes (1) or no (0), represented either by the presence
- of an electrical charge at a particular site or the absence of
- one. Accordingly, if in a row of three switches the first two
- are in the on position (11) and the third is off (0), they would
- represent the number six (110).
- </p>
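- <p> The same idea can be sketched in Python, with a list of
- true-false values standing in for the row of switches (the names
- here are purely illustrative, not drawn from any real machine):
- </p>
- <p>
-     # Three switches: the first two on, the third off -- the row 110.
-     switches = [True, True, False]
-
-     # Reading left to right, each switch doubles the running total
-     # and adds 1 if that switch is on: (1*2 + 1)*2 + 0 = 6.
-     value = 0
-     for on in switches:
-         value = value * 2 + (1 if on else 0)
-     print(value)  # 6
- </p>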
- <p> In the world of digital computers, each of these pieces of
- information is called a bit (for binary digit). In most
- personal computers, bits are shuttled about within the machine
- eight at a time, although some faster 16-bit machines are
- already on the small-computer market and even speedier 32-bit
- machines are in the offing. Clusters of eight bits, forming the
- equivalent of a single letter in ordinary language, are called
- bytes. A typical personal computer offers users anywhere from
- about 16,000 bytes of memory (16K) to 64,000 (64K). But that
- figure is climbing fast. A few years ago, the standard memory
- chip, a quarter-inch square of silicon, was 16K. Today it is
- rapidly becoming 64K, and the industry is already talking of
- mass-producing 256K chips.
- </p>
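- <p> A brief Python sketch shows the byte-letter equivalence the
- paragraph describes (the letter "A" and the ASCII code used here
- are just examples):
- </p>
- <p>
-     # One byte -- eight bits -- holds the code for a single letter.
-     code = ord("A")             # 65 under the ASCII scheme
-     print(format(code, "08b"))  # 01000001: eight bits, one byte
-
-     # A "16K" memory is roughly 16,000 such bytes (exactly 16 x 1024).
-     print(16 * 1024)            # 16384
- </p>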
- <p> The novel idea of using strings of 1s and 0s to solve complex
- problems traces back to another gifted Englishman, George Boole.
- A contemporary of Babbage's, he developed a system of
- mathematical logic that allows problems to be solved by reducing
- them to a series of questions requiring only an answer of true
- or false. Just three logical functions, called AND, OR and NOT,
- are needed to process Boole's "trues" and "falses," or 1s and
- 0s. In computers these operations are performed by simple
- combinations of on-off switches, called logic gates. They pass
- on information, that is, pulses of electricity, only according
- to the Boolean rules built within them. Even a small home
- computer has thousands of such gates, each opening and closing
- more than a million times a second, sending bits and bytes of
- information coursing through the circuitry at nearly light's
- velocity (electricity travels about a foot in a billionth of a
- second).
- </p>
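- <p> A short Python sketch makes the point concrete: with only
- AND, OR and NOT, one can wire up a "half adder," the circuit
- that adds two one-bit numbers (the function names below are
- illustrative, not standard hardware terminology):
- </p>
- <p>
-     # Boole's three functions, written as operations on 1s and 0s.
-     def AND(a, b): return a & b
-     def OR(a, b):  return a | b
-     def NOT(a):    return 1 - a
-
-     # A half adder built purely from the three gates: it adds two
-     # one-bit numbers, yielding a sum bit and a carry bit.
-     def half_adder(a, b):
-         carry = AND(a, b)
-         total = AND(OR(a, b), NOT(AND(a, b)))  # exclusive-or
-         return total, carry
-
-     print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
- </p>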
- <p> The earliest digital computers were much more plodding. They
- relied on electromechanical on-off switches called relays, which
- physically opened and closed like the old Morse code keys.
- Physicist-Author Jeremy Bernstein recalls that Mark I, IBM's
- first large computer assembled at Harvard during World War II,
- sounded "like a roomful of ladies knitting." I could multiply
- two 23-digit numbers in about five seconds. Even some hand-held
- calculators can now do the same job in a fraction of the time.
- </p>
- <p> ENIAC vastly increased computer speed by using vacuum tubes
- rather than electromechanical relays as its switches, but it
- still had a major shortcoming. To perform different operations,
- it had to be manually rewired, like an old wire-and-plug
- telephone switchboard, a task that could take several days. The
- Hungarian-born mathematical genius John von Neumann saw a
- solution. He suggested putting the machine's operating
- instructions, or program, within the same memory as the data to
- be processed and writing it in the same binary language. The
- computer could thus be programmed through the same input devices
- used to feed in data, such as a keyboard or a reel of tape. The
- first commercial computer to have such capability was
- Sperry-Rand's UNIVAC I, which appeared in 1951 and, much to
- IBM's chagrin at being beaten, was promptly delivered to the
- Census Bureau.
- </p>
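- <p> Von Neumann's insight can be suggested with a toy Python
- sketch (a deliberately simplified machine, not a model of UNIVAC
- or any real computer) in which instructions and data sit in the
- same memory:
- </p>
- <p>
-     # Program and data share one memory; reprogramming the machine
-     # means nothing more than writing different values into it.
-     memory = [
-         ("LOAD", 5),   # copy the value at address 5 into the accumulator
-         ("ADD", 6),    # add the value at address 6
-         ("PRINT", 0),  # print the accumulator
-         ("HALT", 0),   # stop
-         None,          # unused cell
-         40, 2,         # the data, stored right after the program
-     ]
-     acc, pc = 0, 0
-     while True:
-         op, addr = memory[pc]
-         pc += 1
-         if op == "LOAD":
-             acc = memory[addr]
-         elif op == "ADD":
-             acc += memory[addr]
-         elif op == "PRINT":
-             print(acc)  # 42
-         elif op == "HALT":
-             break
- </p>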
- <p> Yet even while journalists were hailing the new "electronic
- superbrains," the machines were already becoming obsolete. In
- 1947 three scientists at Bell Labs invented a tiny, deceptively
- simple device called the transistor (short for transfer
- resistance). It was nothing more than a sandwich of
- semiconducting materials, mostly crystals of germanium; silicon
- became popular later. The crystals were arranged so that a tiny
- current entering one part of the sandwich could control a larger
- current in another. Hence, they could be used as switches,
- controlling the ebb and flow of electrons. Even the earliest
- transistors were much smaller than vacuum tubes, worked faster
- and had fewer failures. They gave off so little heat that they
- could be packed closely together. Above all, they were quite
- cheap to make.
- </p>
- <p> Within a few years, the wizards at Bell Labs built the first
- fully transistorized (or solid-state) computer, a machine
- called Leprechaun. But by then Ma Bell, eager to avoid the wrath
- of the Justice Department's trustbusters, had sold licenses for
- only $25,000 to anyone who wanted to make transistors, and the
- scramble was on to profit from them. William Shockley, one of
- the transistor's three inventors, returned to his California
- home town, Palo Alto, to form his own company in the heart of
- what would become known as Silicon Valley. In Dallas, a young,
- aggressive maker of exploration gear for the oil industry, Texas
- Instruments, had already hired away another Bell Labs star,
- Gordon Teal, and was churning out the little gadgets. So were
- old-line tube makers such as General Electric, RCA, Sylvania and
- Raytheon. Much of their production went to the Pentagon, which
- found transistors ideal for a special computing task: the
- guidance of missiles.
- </p>
- <p> The first computers, even those built with transistors, were
- put together like early radios, with tangles of wires connecting
- each component. But soon electronics manufacturers realized that
- the wiring could be "printed" directly on a board, eliminating
- much of the hand-wiring. Then came another quantum leap into the
- miniworld. In the late 1950s, Texas Instruments' Jack Kilby and
- Fairchild Semiconductor's Robert Noyce (one of eight defectors
- from Shockley's firm whom he scathingly called the "traitorous
- eight") had the same brainstorm. Almost simultaneously, they
- realized that any number of transistors could be etched directly
- on a single piece of silicon along with the connections between
- them. Such integrated circuits (ICs) contained entire sections
- of a computer, for example, a logic circuit or a memory
- register. The microchip was born.
- </p>
- <p> Designers kept cramming in more and more transistors. Today,
- hundreds of thousands can be etched on a tiny silicon chip. The
- chips also began incorporating more circuits. But even such so-
- called large-scale integration had a drawback. With the
- circuits rigidly fixed in the silicon, the chips performed only
- the duties for which they were designed. They were "hardwired,"
- as engineers say. That changed dramatically in 1971, when Intel
- Corp., a Silicon Valley company founded by Noyce after yet
- another "defection," unveiled the microprocessor. Designed by
- a young Intel engineer named Ted Hoff, it contained the entire
- central processing unit (CPU) of a simple computer on one chip.
- It was Babbage's mighty mill in microcosm.
- </p>
- <p> With the microprocessor, a single chip could be programmed to
- do any number of tasks, from running a watch to steering a
- spacecraft. It could also serve as the soul of a new machine:
- the personal computer. By 1975 the first of the new breed of
- computers had appeared, a hobbyist machine called the Altair
- 8800 (cost: $395 in kit form, $621 assembled). The Altair soon
- vanished from the marketplace. But already there were other
- young and imaginative tinkerers out in Silicon Valley getting
- ready to produce personal computers, including one bearing an
- odd symbol: an apple with a bite taken out of it. Suddenly, the
- future was now.
- </p>
- <p>-- By Frederic Golden
- </p>
-
- </body>
- </article>
- </text>